AIbase
Pure English Pretraining

# Pure English Pretraining

Academic Ds 9B
Apache-2.0
A 9-billion-parameter large language model based on the DeepSeek-V3 architecture, trained from scratch on a fully open-source, exclusively English dataset of over 350 billion tokens, designed to support open-source community development and debugging.
Tags: Large Language Model · Transformers · English
ByteDance-Seed
© 2025 AIbase